  • A new study of YouTube’s recommendation algorithms shows the filter bubble is in full effect.
  • A user’s history of watching misinformation about key conspiracy theories results in more such videos being recommended to them.
  • But one exception is lies about vaccines. “YouTube might be reacting differently to different topics based on the pressures they’re getting from the media,” said one of the study’s authors.

YouTube’s propensity to push people down rabbit holes has been repeatedly probed by journalists and academics alike, and a new research paper shows the filter bubble in action.

Tanushree Mitra and colleagues at Virginia Tech’s department of computer science analyzed how YouTube’s recommendation algorithms push videos on topics that are lightning rods for conspiracy theories.

The academics looked at how YouTube suggests videos related to 9/11 conspiracy theories, chemtrails, and the claims that the Earth is flat, that we didn’t land on the Moon, and that vaccines are harmful or don’t work.


“We saw all these media reports and opinion pieces talking about how YouTube is driving people down the rabbit hole,” said Mitra. “But I was like: ‘All these reports are talking without any empirical evidence. Is this actually happening?’”

They gathered 56,475 videos on those five topics and audited YouTube’s search and recommendation algorithms.

They created bot accounts on YouTube that then engaged with those topics and videos by watching them and searching for them.

In the search audit, the bot accounts searched for videos on each topic using common search terms, and the researchers recorded which videos YouTube's search algorithm surfaced in response.
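The paper's crawling setup isn't described in the article, so the sketch below is a rough illustration only of that kind of search audit: `run_search` is a stub standing in for a logged-in bot account issuing a query and returning labeled results, and the search terms, account IDs, and stance labels are hypothetical, not the researchers' code or data.

```python
from collections import Counter

# Hypothetical stand-in for a logged-in bot account issuing a YouTube search
# and returning labeled top results; a real audit would drive actual accounts
# with their own search/watch histories. Stance labels here are dummy data.
def run_search(account_id: str, query: str, n_results: int = 10) -> list[dict]:
    return [
        {"video_id": f"{query[:8]}-{i}",
         "stance": "promoting" if i % 3 else "debunking"}
        for i in range(n_results)
    ]

# Example search terms per topic (illustrative, not the study's seed queries).
SEARCH_TERMS = {
    "flat earth": ["is the earth flat", "flat earth proof"],
    "anti-vaccine": ["are vaccines safe", "vaccine side effects"],
}

def search_audit(account_id: str, topic: str) -> Counter:
    """Tally the stance of every video surfaced for a topic's search terms."""
    tally = Counter()
    for query in SEARCH_TERMS[topic]:
        for result in run_search(account_id, query):
            tally[result["stance"]] += 1
    return tally

for topic in SEARCH_TERMS:
    print(topic, dict(search_audit("bot-001", topic)))
```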

They found that YouTube was better at pulling people out of the anti-vaccine rabbit hole than other conspiracy subjects.

"No matter how much you search for anti-vaccines, or if a user goes and searches for anti-vaccine videos, the resulting recommendations from the algorithm would still be pointing them to debunking videos, or pro-vaccine videos," said Mitra. "That's not the case for other ones, which potentially proves it'll push you down the rabbit hole if you're looking for chem trails, but not for vaccines."

A similar watch audit involved the bot accounts watching different types of videos related to each topic.

One set of bot accounts would watch solely anti-vaccine videos; another would watch videos debunking anti-vaccine conspiracy theories; and a third would consume a video diet that both supported and punctured misinformation about vaccines.
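As a minimal sketch only, the Python snippet below shows how those three watch-history conditions might be set up and compared: `get_up_next` is a stub that returns dummy stance labels rather than real YouTube recommendations, and all names and data are hypothetical, not the study's implementation.

```python
import random
from collections import Counter

# Three hypothetical watch-history conditions for the vaccine topic,
# mirroring the bot-account setups described above.
CONDITIONS = {
    "anti_vaccine_only": ["promoting"] * 20,
    "debunking_only": ["debunking"] * 20,
    "mixed": ["promoting", "debunking"] * 10,
}

def get_up_next(watch_history: list[str], n: int = 5) -> list[str]:
    """Stub for the Up Next / top-5 recommendations a bot account would see
    after a given watch history; returns dummy stance labels."""
    random.seed(",".join(watch_history))  # deterministic dummy output per history
    return [random.choice(["promoting", "debunking"]) for _ in range(n)]

def watch_audit() -> None:
    """Compare the stance mix of recommendations across the three conditions."""
    for condition, history in CONDITIONS.items():
        recs = get_up_next(history)
        print(condition, dict(Counter(recs)))

watch_audit()
```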

"We found even if the behaviour is watching anti-vaccine videos, the algorithm still gives pro-vaccine recommendations on the Up Next and top five recommendations sidebar, which was not the case for the other topics," she said. "That's where the difference lies between vaccine topics and the other topics we audited."

Mitra hypothesizes that YouTube is more proactively policing anti-vaccine videos given the current importance of the topic to the world's battle against coronavirus.

"A lot of these media articles are initially about how these platforms in general are pushing people towards vaccine controversies," she said, "so it's not surprising that's the first topic they want to tackle, and the other ones aren't a high priority for them."

A YouTube spokesperson said: "We're committed to providing timely and helpful information, including raising authoritative content, reducing the spread of harmful misinformation and showing information panels, to help combat misinformation."

They added: "We also have clear policies that prohibit videos that encourage harmful or dangerous content, impersonation or hate speech. When videos are flagged to us that break our policies, we quickly remove them."
